Conversation

@maycmlee
Contributor

What does this PR do? What is the motivation?

Merge instructions

Merge readiness:

  • Ready for merge

For Datadog employees:

Your branch name MUST follow the <name>/<description> convention and include the forward slash (/). Without this format, your pull request will not pass CI, the GitLab pipeline will not run, and you won't get a branch preview. Getting a branch preview makes it easier for us to check any issues with your PR, such as broken links.

If your branch doesn't follow this format, rename it or create a new branch and PR.

[6/5/2025] Merge queue has been disabled on the documentation repo. If you have write access to the repo, the PR has been reviewed by a Documentation team member, and all of the required checks have passed, you can use the Squash and Merge button to merge the PR. If you don't have write access, or you need help, reach out in the #documentation channel in Slack.

Additional notes

@maycmlee maycmlee added the WORK IN PROGRESS No review needed, it's a wip ;) label Nov 12, 2025
@maycmlee maycmlee requested a review from a team as a code owner November 12, 2025 23:22
@github-actions github-actions bot added Architecture Everything related to the Doc backend Guide Content impacting a guide labels Nov 12, 2025
@github-actions
Contributor

github-actions bot commented Nov 12, 2025


#### Fixes

- THe HTTP Client source's authorization strategy has been fixed.
Uppercase H in "THe" at the start

#### New features

- [OpenTelemetry Collector source][7]: Send logs from your OpenTelemetry Collector to Observability Pipelines.
- [Datadog CloudPrem destination][8]: Send logs from Observability Pipelines to Datadog CloudPrem.
Maybe change up the wording and say "Route logs to Datadog CloudPrem destination"


#### New features

- [OpenTelemetry Collector source][7]: Send logs from your OpenTelemetry Collector to Observability Pipelines.
Change to "Ingest logs from the OpenTelemetry Collector to Observability Pipelines"


#### Enhancements

- The Elasticsearch destination's indexing strategy has been updated to include data streams.
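For context on what an indexing strategy that targets data streams implies: Elasticsearch data streams follow the `logs-<dataset>-<namespace>` naming scheme, and bulk writes to a data stream must use the `create` action rather than `index`. A minimal sketch of that routing logic (the helper functions below are hypothetical illustrations, not the destination's actual implementation):

```python
# Hypothetical sketch: routing events to Elasticsearch data streams.
# Data streams use Elastic's logs-<dataset>-<namespace> naming scheme,
# and the bulk API only accepts the "create" action for them.

def data_stream_name(dataset: str, namespace: str = "default") -> str:
    """Build a data stream name following the logs-<dataset>-<namespace> convention."""
    return f"logs-{dataset}-{namespace}"

def bulk_action(event: dict) -> dict:
    """Build the bulk-API action line for one event (data streams require 'create')."""
    target = data_stream_name(event.get("dataset", "generic"),
                              event.get("namespace", "default"))
    return {"create": {"_index": target}}

print(bulk_action({"dataset": "nginx", "namespace": "prod"}))
# {'create': {'_index': 'logs-nginx-prod'}}
```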

#### New features

- [The HTTP Client destination][1]: Send logs to an HTTP client, such as a logging platform or SIEM.
Capitalize "Client"

#### New features

- [The HTTP Client destination][1]: Send logs to an HTTP client, such as a logging platform or SIEM.
- [Processor Groups][2]: Organize your processors into logical groups to help you manage them.
Contributor Author

It is linked? Are you not seeing it in the preview?


- [The HTTP Client destination][1]: Send logs to an HTTP client, such as a logging platform or SIEM.
- [Processor Groups][2]: Organize your processors into logical groups to help you manage them.
- [Disk][3] and [memory][4] buffering options are available for destinations.
Contributor Author

It's linking to the specific Disk and Memory buffering section of that doc. Are you thinking we should just link to the main doc (no anchor links to the sections)?

- The `decode_lz4` custom function has been updated to support decompressing `lz4` frame data.
- The Azure Blob Storage and Google Cloud Storage archive destinations' prefix fields support template syntax.
- The Splunk HEC destination has a custom environment variable.
- The sample processor has an optional `group_by` parameter.
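To illustrate what a `group_by` parameter adds to sampling: each group is sampled independently at the configured rate, so high-volume groups cannot drown out low-volume ones. A hypothetical sketch of group-keyed 1-in-N sampling (not the processor's actual implementation):

```python
# Hypothetical sketch of group-keyed sampling: each group key gets its
# own counter, so every group is sampled at 1-in-`rate` independently.
from collections import defaultdict

def make_group_sampler(rate: int):
    """Return a sampler that keeps every `rate`-th event per group key."""
    counters = defaultdict(int)
    def keep(event: dict, group_by: str) -> bool:
        key = event.get(group_by, "")
        counters[key] += 1
        return counters[key] % rate == 1  # keep the 1st, (rate+1)th, ... per group
    return keep

keep = make_group_sampler(10)
events = [{"service": "api"}] * 25 + [{"service": "worker"}] * 3
kept = [e for e in events if keep(e, "service")]
# 3 "api" events kept (1st, 11th, 21st) plus the 1st "worker" event
```

Without the grouping, a plain 1-in-10 sampler over the same stream could easily drop all three `worker` events.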
